
    Do online resources give satisfactory answers to questions about meaning and phraseology?

    In this paper we explore some aspects of the differences between printed paper dictionaries and online dictionaries in the ways in which they explain meaning and phraseology. After noting the importance of the lexicon as an inventory of linguistic items and the neglect in both linguistics and lexicography of phraseological aspects of that inventory, we investigate the treatment in online resources of phraseology – in particular, the phrasal verbs wipe out and put down – and we go on to investigate a word, dope, that has undergone some dramatic meaning changes during the 20th century. In the course of discussion, we mention the new availability of corpus evidence and the technique of Corpus Pattern Analysis, which is important for linking phraseology and meaning and distinguishing normal phraseology from rare and unusual phraseology. The online resources that we discuss include Google, the Urban Dictionary (UD), and Wiktionary.
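
    A minimal sketch of the kind of corpus query that Corpus Pattern Analysis builds on: counting which words typically follow a phrasal verb such as wipe out, so that its normal phraseology can be separated from rare and unusual usage. The corpus file name and the crude regular expression below are illustrative assumptions, not the authors' actual procedure.

import re
from collections import Counter

def following_word_counts(text: str, verb_pattern: str = r"wip(?:e|es|ed|ing)\s+out") -> Counter:
    """Count the word immediately following each occurrence of the phrasal verb."""
    pattern = re.compile(verb_pattern + r"\s+(?:the\s+|a\s+|an\s+)?(\w+)", re.IGNORECASE)
    return Counter(match.group(1).lower() for match in pattern.finditer(text))

if __name__ == "__main__":
    # "corpus.txt" is a placeholder for any plain-text corpus sample.
    with open("corpus.txt", encoding="utf-8") as handle:
        counts = following_word_counts(handle.read())
    for word, frequency in counts.most_common(10):
        print(f"{word}\t{frequency}")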

    Intercultural New Media Studies: The Next Frontier in Intercultural Communication

    New media (ICTs) are transforming communication across cultures. Despite this revolution in cross-cultural contact, communication researchers have largely ignored the impact of new media on intercultural communication. This groundbreaking article defines the parameters of a new field of inquiry called Intercultural New Media Studies (INMS), which explores the intersection between ICTs and intercultural communication. Composed of two research areas—(1) new media and intercultural communication theory and (2) culture and new media—INMS investigates new digital theories of intercultural contact as well as refines and expands twentieth-century intercultural communication theories, examining their salience in a digital world. INMS promises to increase our understanding of intercultural communication in a new media age and is the next frontier in intercultural communication.

    Light Higgsino from Axion Dark Radiation

    Recent observations imply that there is an extra relativistic degree of freedom, referred to as dark radiation. We argue that the QCD axion is a plausible candidate for the dark radiation, not only because of its extremely small mass, but also because in the supersymmetric extension of the Peccei-Quinn mechanism the saxion tends to dominate the Universe and decays into axions with a sizable branching fraction. We show that the Higgsino mixing parameter μ is bounded from above when the axions produced in saxion decays constitute the dark radiation: μ ≲ 300 GeV for a saxion lighter than 2m_W, and μ less than the saxion mass otherwise. Interestingly, the Higgsino can be light enough to be within the reach of the LHC and/or ILC even when the other superparticles are heavy, with masses of about 1 TeV or higher. We also estimate the abundance of axinos produced by the decays of the Higgsino and the saxion. Comment: 18 pages, 1 figure; published in JHEP
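
    The bound quoted above can be collected into a single expression; this merely restates the abstract's result, with m_\sigma denoting the saxion mass:

    \[
      \mu \;\lesssim\;
      \begin{cases}
        300~\mathrm{GeV}, & m_\sigma < 2\,m_W,\\
        m_\sigma, & m_\sigma \ge 2\,m_W.
      \end{cases}
    \]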

    Radiative contribution to neutrino masses and mixing in μνSSM

    In an extension of the minimal supersymmetric standard model (popularly known as the μνSSM), three right-handed neutrino superfields are introduced to solve the μ-problem and to accommodate the non-vanishing neutrino masses and mixing. Neutrino masses at the tree level are generated through R-parity violation and the seesaw mechanism. We have analyzed the full effect of one-loop contributions to the neutrino mass matrix. We show that the current three-flavour global neutrino data can be accommodated in the μνSSM, for both the tree-level and one-loop corrected analyses. We find that it is relatively easier to accommodate the normal hierarchical mass pattern compared to the inverted hierarchical or quasi-degenerate case, when one-loop corrections are included. Comment: 51 pages, 14 figures (58 .eps files), expanded introduction, other minor changes, references added
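
    As a schematic reminder of the seesaw structure invoked above (not the μνSSM's full neutralino-neutrino mass matrix, which the paper treats in detail), the tree-level light-neutrino mass matrix takes the generic form

    \[
      m_\nu^{\mathrm{eff}} \;\simeq\; -\, m_D \, M^{-1} \, m_D^{T},
    \]

    where m_D collects the small R-parity-violating mixings between the light neutrinos and the heavy neutral states, and M is the heavy-block mass matrix; the one-loop contributions analyzed in the paper correct this matrix.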

    Grifonin-1: A Small HIV-1 Entry Inhibitor Derived from the Algal Lectin, Griffithsin

    Background: Griffithsin, a 121-residue protein isolated from the red alga Griffithsia sp., binds high-mannose N-linked glycans of virus surface glycoproteins with extremely high affinity, a property that allows it to prevent the entry of primary isolates and laboratory strains of T- and M-tropic HIV-1. We used a portion of griffithsin's sequence as a design template to create smaller peptides with antiviral and carbohydrate-binding properties. Methodology/Results: The new peptides were derived from a trio of homologous β-sheet repeats that comprise the motifs responsible for griffithsin's biological activity. Our most active antiviral peptide, grifonin-1 (GRFN-1), had an EC50 of 190.8±11.0 nM in in vitro TZM-bl assays and an EC50 of 546.6±66.1 nM in p24gag antigen release assays. GRFN-1 showed considerable structural plasticity, assuming different conformations in solvents that differed in polarity and hydrophobicity. At higher concentrations, GRFN-1 formed oligomers based on intermolecular β-sheet interactions. Like its parent protein, GRFN-1 bound the viral glycoproteins gp41 and gp120 via the N-linked glycans on their surface. Conclusion: Its substantial antiviral activity and low toxicity in vitro suggest that GRFN-1 and/or its derivatives may have therapeutic potential as topical and/or systemic agents directed against HIV-1.
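
    An EC50 such as the 190.8 nM value above is typically extracted by fitting a four-parameter logistic (Hill) curve to dose-response measurements. The sketch below shows such a fit in Python; the concentrations and inhibition fractions are invented placeholders, not data from the paper.

import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, slope):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** slope)

# Illustrative dose-response data (nM, fraction of infection inhibited).
conc = np.array([10.0, 30.0, 100.0, 300.0, 1000.0, 3000.0])
inhibition = np.array([0.05, 0.15, 0.40, 0.65, 0.85, 0.95])

params, _ = curve_fit(hill, conc, inhibition, p0=[0.0, 1.0, 200.0, 1.0])
print(f"fitted EC50 ~ {params[2]:.1f} nM")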

    Statistical Inference for Valued-Edge Networks: Generalized Exponential Random Graph Models

    Across the sciences, the statistical analysis of networks is central to the production of knowledge on relational phenomena. Because of their ability to model the structural generation of networks, exponential random graph models are a ubiquitous means of analysis. However, they are limited by an inability to model networks with valued edges. We solve this problem by introducing a class of generalized exponential random graph models capable of modeling networks whose edges are valued, thus greatly expanding the scope of networks that applied researchers can subject to statistical analysis.
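
    For orientation, the standard ERGM and its valued-edge generalization share the exponential-family form below, where y is the observed (here, valued) network, h(y) is a vector of network statistics, and θ is the parameter vector; the paper's specific construction of the reference measure for continuous edge values is not reproduced here:

    \[
      P_{\theta}(Y = y) \;=\; \frac{\exp\{\theta^{T} h(y)\}}{\int \exp\{\theta^{T} h(z)\}\, dz},
    \]

    with the integral replaced by a sum over all graphs in the binary-edge case.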

    The Demographic and Socioeconomic Factors Predictive for Populations at High-Risk for La Crosse Virus Infection in West Virginia

    Although a large body of literature exists on the environmental risk factors for La Crosse virus (LACV) transmission, the demographic and socioeconomic risk factors for developing LACV infection have not been investigated. Therefore, this study investigated the demographic and socioeconomic risk factors for LACV infection in West Virginia from 2003 to 2007, using two forward stepwise discriminant analyses. The discriminant analyses were used to evaluate a number of demographic and socioeconomic factors for their ability to predict: 1) census tracts with at least one reported case of LACV infection versus census tracts with no reported cases, and 2) significantly high-risk clusters for LACV infection versus significantly low-risk clusters. In the first model, a high school diploma or general education diploma or less and a lower housing density…
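
    A minimal sketch of a forward stepwise discriminant analysis in the spirit described above: predictors are added one at a time and census tracts are classified as reporting versus not reporting LACV cases. The file name, column names, and number of selected features are illustrative assumptions, not the study's data or exact procedure.

import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector

# Hypothetical table of census-tract covariates and case status.
tracts = pd.read_csv("tracts.csv")
X = tracts[["pct_hs_or_less", "housing_density", "median_income", "pct_under_15"]]
y = tracts["reported_case"]  # 1 = at least one reported LACV case, 0 = none

lda = LinearDiscriminantAnalysis()
selector = SequentialFeatureSelector(lda, direction="forward", n_features_to_select=2)
selector.fit(X, y)
print("selected predictors:", list(X.columns[selector.get_support()]))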

    New Constraints (and Motivations) for Abelian Gauge Bosons in the MeV-TeV Mass Range

    We survey the phenomenological constraints on abelian gauge bosons having masses in the MeV to multi-GeV range (using precision electroweak measurements, neutrino-electron and neutrino-nucleon scattering, electron and muon anomalous magnetic moments, upsilon decay, beam dump experiments, atomic parity violation, low-energy neutron scattering and primordial nucleosynthesis). We compute their implications for the three parameters that in general describe the low-energy properties of such bosons: their mass and their two possible types of dimensionless couplings (direct couplings to ordinary fermions and kinetic mixing with Standard Model hypercharge). We argue that gauge bosons with very small couplings to ordinary fermions in this mass range are natural in string compactifications and are likely to be generic in theories for which the gravity scale is systematically smaller than the Planck mass - such as in extra-dimensional models - because of the necessity to suppress proton decay. Furthermore, because its couplings are weak, in the low-energy theory relevant to experiments at and below TeV scales the charge gauged by the new boson can appear to be broken, both by classical effects and by anomalies. In particular, if the new gauge charge appears to be anomalous, anomaly cancellation does not also require the introduction of new light fermions in the low-energy theory. Furthermore, the charge can appear to be conserved in the low-energy theory, despite the corresponding gauge boson having a mass. Our results reduce to those of other authors in the special cases where there is no kinetic mixing or there is no direct coupling to ordinary fermions, such as for recently proposed dark-matter scenarios. Comment: 49 pages + appendix, 21 figures. This is the final version which appears in JHEP
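
    Schematically, the parameter space surveyed above is spanned by the new boson's mass and its two dimensionless couplings. A conventional (not paper-specific) way to write the relevant low-energy terms, with X_\mu the new gauge boson, B_{\mu\nu} the hypercharge field strength, \chi the kinetic mixing, and g_X J_X^\mu the direct coupling to ordinary fermions, is

    \[
      \mathcal{L} \;\supset\; -\tfrac{1}{4}\, X_{\mu\nu} X^{\mu\nu}
      \;-\; \tfrac{\chi}{2}\, X_{\mu\nu} B^{\mu\nu}
      \;+\; \tfrac{1}{2}\, m_X^{2}\, X_\mu X^\mu
      \;+\; g_X\, X_\mu J_X^{\mu}.
    \]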

    Assessing Risk in Focal Arboviral Infections: Are We Missing the Big or Little Picture?

    Focal arboviral infections affecting a subset of the overall population present an often overlooked set of challenges in the assessment and reporting of risk and the detection of spatial patterns. Our objective was to assess the variation in risk when using different at-risk populations and geographic scales for the calculation of incidence risk and the detection of geographic hot-spots of infection. We explored these variations using a pediatric arbovirus, La Crosse virus (LACV), as our model.

    Descriptive and cluster analyses were performed on probable and confirmed cases of LACV infection reported to the Tennessee Department of Health from 1997 to 2006, using three at-risk populations (the total population, the population 18 years and younger, and the population 15 years and younger) and at two geographic levels (county and census tract) to assess the variation in incidence risk and to investigate evidence of clustering using both global and local spatial statistics. We determined that the most appropriate at-risk population for calculating incidence risk and assessing evidence of clustering was the population 15 years and younger. Based on our findings, the most appropriate geographic level at which to conduct spatial analyses and report incidence risk is the census tract. The incidence risk in the population 15 years and younger at the county level ranged from 0 to 226.5 per 100,000 persons (median 41.5) in those counties reporting cases (n = 14), and at the census tract level it ranged from 50.9 to 673.9 per 100,000 persons (median 126.7) in those census tracts reporting cases (n = 51). To our knowledge, this is the highest reported incidence risk for this population at the county level for Tennessee and at the census tract level nationally.

    The results of this study indicate the possibility of missing disease clusters when incidence risk investigations of focal diseases are performed using inappropriate at-risk populations and/or at overly large geographic scales. Improved disease surveillance and health planning will result from the use of well-defined at-risk populations and appropriate geographic scales for the analysis and reporting of diseases. The finding of a high incidence risk of LACV infection in eastern Tennessee demonstrates that the vast majority of these infections continue to be underdiagnosed and/or underreported in this region. Persistent prevention and surveillance efforts will be required to reduce exposure to infectious vectors and to detect new cases of infection in this region. Application of this study's observations in future investigations will enhance the quantification of incidence risk and the identification of high-risk groups within the population.
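
    The incidence-risk figures above follow from a simple rate over whichever at-risk population is chosen, which is why that choice matters so much. A minimal sketch, with made-up counts rather than the Tennessee surveillance data:

def incidence_risk_per_100k(cases: int, at_risk_population: int) -> float:
    """Cumulative incidence risk expressed per 100,000 persons at risk."""
    return 100_000 * cases / at_risk_population

# e.g. a census tract with 2 reported LACV cases among 1,200 residents aged 15 or younger
print(f"{incidence_risk_per_100k(2, 1200):.1f} per 100,000")  # -> 166.7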